Syntax-Directed Amorphous Slicing
Authors
Abstract
Similar resources
Amorphous program slicing
Traditional, syntax-preserving program slicing simplifies a program by deleting components (e.g., statements and predicates) that do not affect a computation of interest. Amorphous slicing removes the limitation to component deletion as the only means of simplification, while retaining the semantic property that a slice preserves the selected behaviour of interest from the original program. Thi...
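To make the contrast concrete, here is a minimal sketch in Python (the toy function, the slicing criterion on prod, and the use of math.prod are illustrative assumptions, not an example taken from the cited paper):

    import math

    # Original program: computes both a sum and a product.
    def original(xs):
        total = 0
        prod = 1
        for x in xs:
            total += x      # has no effect on prod
            prod *= x
        return total, prod

    # Syntax-preserving slice on the final value of prod:
    # only whole statements may be deleted.
    def syntactic_slice(xs):
        prod = 1
        for x in xs:
            prod *= x
        return prod

    # Amorphous slice on prod: any simplification is allowed as long as the
    # final value of prod is preserved, so the loop can be folded away.
    def amorphous_slice(xs):
        return math.prod(xs)

Both slices agree with the original on the value of prod; the amorphous slice is simply free to change the syntax to get there.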
Analysis of Dynamic Memory Access Using Amorphous Slicing
Problems associated with understanding, verifying and reengineering the way in which a system allocates and releases dynamic memory present significant challenges to the software maintainer. Because the questions underlying these problems are undecidable, no system can provide a completely fail-safe certification. For example, in checking for memory leaks, a system can only warn of potential pr...
Syntax Directed Translation with LR Parsing
A simple extension of the usual LR parser construction is made in order to build a translator. The LR parsing algorithm is extended with a facility to perform output operations within the shift and reduce actions. A class of translation grammars, called R-translation grammars, is introduced as an extension of the class of postfix translation grammars. Transformations called shaking-down and...
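A minimal sketch of the idea in Python (a hand-rolled shift-reduce loop for the toy grammar E -> E '+' n | n, with output emitted inside the reduce actions so that infix input becomes postfix output; the grammar, the token format, and the function name are assumptions for illustration, not the paper's R-translation construction):

    def infix_to_postfix(tokens):
        """tokens is a list of ('n', value) and ('+', '+') pairs."""
        stack, out, i = [], [], 0
        while True:
            if len(stack) >= 3 and [s[0] for s in stack[-3:]] == ['E', '+', 'n']:
                # Reduce E -> E '+' n; output action: emit the operand, then '+'.
                _, value = stack.pop()
                stack.pop()
                stack.pop()
                out += [value, '+']
                stack.append(('E', None))
            elif len(stack) == 1 and stack[0][0] == 'n':
                # Reduce E -> n; output action: emit the operand.
                _, value = stack.pop()
                out.append(value)
                stack.append(('E', None))
            elif i < len(tokens):
                stack.append(tokens[i])   # shift the next input token
                i += 1
            else:
                break
        return ' '.join(out)

    print(infix_to_postfix([('n', '1'), ('+', '+'), ('n', '2'), ('+', '+'), ('n', '3')]))
    # -> 1 2 + 3 +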
Syntax-Directed Variational Autoencoder for Molecule Generation
Deep generative models have been enjoying success in modeling continuous data. However, it remains challenging to capture representations for discrete structures with formal grammars and semantics. How to generate both syntactically and semantically correct data remains largely an open problem. Inspired by compiler theory, where syntax and semantics checks are done via syntax-direc...
Syntax-Directed Attention for Neural Machine Translation
The attention mechanism, including global attention and local attention, plays a key role in neural machine translation (NMT). Global attention attends to all source words for word prediction. In comparison, local attention selectively looks at fixed-window source words. However, alignment weights for the current target word often decrease to the left and right by linear distance centering on the a...
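The contrast between the two baseline schemes named in the abstract can be sketched as follows (dot-product scoring, the hidden size, and the window width are illustrative assumptions, not values from the paper):

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def global_attention(query, source_states):
        # Attend to every source position.
        weights = softmax(source_states @ query)      # shape: (src_len,)
        return weights @ source_states                # context vector

    def local_attention(query, source_states, center, window=2):
        # Attend only to a fixed window of positions around `center`.
        lo = max(0, center - window)
        hi = min(len(source_states), center + window + 1)
        weights = softmax(source_states[lo:hi] @ query)
        return weights @ source_states[lo:hi]

    src = np.random.randn(10, 8)   # 10 source positions, hidden size 8
    q = np.random.randn(8)         # current decoder state
    ctx_global = global_attention(q, src)
    ctx_local = local_attention(q, src, center=4)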
Journal
Journal title: Automated Software Engineering
Year: 2004
ISSN: 0928-8910
DOI: 10.1023/b:ause.0000008667.37988.11